Learning of Structurally Unambiguous Probabilistic Grammars
Authors
Abstract
The problem of identifying a probabilistic context-free grammar has two aspects: the first is determining the grammar's topology (the rules of the grammar) and the second is estimating probabilistic weights for each rule. Given the hardness results for learning context-free grammars in general, and probabilistic grammars in particular, most of the literature has concentrated on the second problem. In this work we address the first problem. We restrict attention to structurally unambiguous weighted context-free grammars (SUWCFG) and provide a query learning algorithm for structurally unambiguous probabilistic context-free grammars (SUPCFG). We show that SUWCFG can be represented using co-linear multiplicity tree automata (CMTA), and provide a polynomial learning algorithm that learns CMTAs. The learned CMTA can be converted into a probabilistic grammar, thus providing a complete algorithm for learning a structurally unambiguous probabilistic context-free grammar (both the grammar topology and the probabilistic weights) using structured membership queries and structured equivalence queries. We demonstrate the usefulness of our algorithm in learning PCFGs over genomic data.
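The two aspects named in the abstract can be illustrated with a toy PCFG. A minimal sketch, with invented grammar symbols: the topology is the set of rules, and the weights attach a probability to each rule such that the weights of rules sharing a left-hand side sum to 1 (a "proper" PCFG).

```python
# Topology: nonterminal -> list of rules; weights: one probability per rule.
# The grammar and symbol names below are invented for illustration only.
pcfg = {
    "S":  [(("NP", "VP"), 0.9), (("VP",), 0.1)],
    "NP": [(("det", "noun"), 0.6), (("noun",), 0.4)],
    "VP": [(("verb", "NP"), 1.0)],
}

def is_proper(grammar, tol=1e-9):
    """A PCFG is proper when, for each nonterminal, the weights of its
    rules sum to 1, so each rule set defines a probability distribution."""
    return all(abs(sum(w for _, w in rules) - 1.0) <= tol
               for rules in grammar.values())
```

Estimating the weight column of such a table is the well-studied second problem; learning the rule column itself is the first problem that the paper addresses.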
Similar resources
Structurally Unambiguous Finite Automata
We define a structurally unambiguous finite automaton (SUFA) to be a nondeterministic finite automaton (NFA) with one starting state q0 such that for all input strings w and for any state q, there is at most one path from q0 to q that consumes w. The definition of SUFA differs from the usual definition of an unambiguous finite automaton (UFA) in that the new definition is defined in terms of th...
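The quoted definition can be checked algorithmically with a pair construction: the automaton is structurally ambiguous exactly when two paths from q0 diverge and later reconverge on the same state while consuming the same string. A sketch following only the definition above (the encoding of the transition relation and the function name are my own, not from the paper):

```python
from collections import deque

def is_structurally_unambiguous(alphabet, delta, q0):
    """Check the SUFA property: for every string w and every state q,
    at most one path from q0 to q consumes w.
    delta: dict mapping (state, letter) -> set of successor states."""
    step = lambda p, a: delta.get((p, a), set())

    # 1. Collect the states reachable from q0.
    reachable, frontier = {q0}, deque([q0])
    while frontier:
        p = frontier.popleft()
        for a in alphabet:
            for p2 in step(p, a):
                if p2 not in reachable:
                    reachable.add(p2)
                    frontier.append(p2)

    # 2. Seed unordered pairs {p, q}, p != q, where two paths with a
    #    common label diverge at some reachable state.
    pairs, frontier = set(), deque()
    for s in reachable:
        for a in alphabet:
            succ = sorted(step(s, a))
            for i in range(len(succ)):
                for j in range(i + 1, len(succ)):
                    pair = frozenset((succ[i], succ[j]))
                    if pair not in pairs:
                        pairs.add(pair)
                        frontier.append(pair)

    # 3. Close the pair set under joint transitions; two diverged paths
    #    reconverging on a common state witnesses structural ambiguity.
    while frontier:
        p, q = tuple(frontier.popleft())
        for a in alphabet:
            if step(p, a) & step(q, a):
                return False
            for p2 in step(p, a):
                for q2 in step(q, a):
                    if p2 != q2:
                        pair = frozenset((p2, q2))
                        if pair not in pairs:
                            pairs.add(pair)
                            frontier.append(pair)
    return True
```

For example, delta = {(0,'a'): {1, 2}, (1,'b'): {3}, (2,'b'): {3}} is nondeterministic and structurally ambiguous (two distinct "ab"-paths from 0 to 3), whereas replacing (2,'b') with (2,'c') yields a SUFA even though it remains nondeterministic.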
Unsupervised learning of probabilistic grammars
Unambiguous Boolean grammars
Boolean grammars are an extension of context-free grammars, in which all propositional connectives are allowed. In this paper, the notion of ambiguity in Boolean grammars is defined. It is shown that the known transformation of a Boolean grammar to the binary normal form preserves unambiguity, and that every unambiguous Boolean language can be parsed in time O(n²). Linear conjunctive languages a...
Learning restricted probabilistic link grammars
We describe a language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar, together with (1) an EM training method for estimating the probabilities, and (2) a procedure for learning some simple lexicalized grammar structures. The model in its simplest form is a generalization of n-gram models, but in its general form possesses context-free exp...
Covariance in Unsupervised Learning of Probabilistic Grammars
Probabilistic grammars offer great flexibility in modeling discrete sequential data like natural language text. Their symbolic component is amenable to inspection by humans, while their probabilistic component helps resolve ambiguity. They also permit the use of well-understood, general-purpose learning algorithms. There has been an increased interest in using probabilistic grammars in the Bayes...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2021
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v35i10.17107